WHORU: Improving Abstractive Dialogue Summarization with Personal Pronoun Resolution
Authors
Abstract
With the abundance of conversations happening everywhere, dialogue summarization plays an increasingly important role in the real world. However, dialogues inevitably involve many personal pronouns, which hinder the performance of existing models. This work proposes a framework named WHORU to inject external personal pronoun resolution (PPR) information into abstractive dialogue summarization models. A simple and effective PPR method for the dialogue domain is further proposed to reduce time and space consumption. Experiments demonstrated the superiority of the proposed methods. More importantly, WHORU achieves new SOTA results on the SAMSum and AMI datasets.
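The abstract describes the idea only at a high level. The following is a minimal, hypothetical sketch of what injecting PPR information into a summarizer's input could look like: pronouns are annotated with their resolved referents before the dialogue is passed to any abstractive summarization model. The two-party resolution rule and the parenthetical annotation format are illustrative assumptions, not the paper's actual WHORU method.

# Minimal sketch of pronoun-resolution-based input augmentation for
# dialogue summarization. NOT the paper's WHORU implementation; the
# injection scheme (appending the resolved referent after each pronoun)
# and the two-party resolution rule are assumptions for illustration.

import re
from typing import List, Tuple

FIRST_PERSON = {"i", "me", "my", "mine", "we", "us", "our"}
SECOND_PERSON = {"you", "your", "yours"}

def resolve_pronouns(turns: List[Tuple[str, str]]) -> List[str]:
    """Annotate first/second-person pronouns with speaker names.

    turns: list of (speaker, utterance) pairs from a two-party dialogue.
    Returns utterances where each resolvable pronoun is followed by its
    assumed referent in parentheses, e.g. "I (Amanda)".
    """
    speakers = [s for s, _ in turns]
    annotated = []
    for speaker, utterance in turns:
        # In a two-party chat, "you" is assumed to refer to the other speaker.
        others = [s for s in set(speakers) if s != speaker]
        other = others[0] if len(others) == 1 else None

        def tag(match: re.Match) -> str:
            word = match.group(0)
            lower = word.lower()
            if lower in FIRST_PERSON:
                return f"{word} ({speaker})"
            if lower in SECOND_PERSON and other is not None:
                return f"{word} ({other})"
            return word

        annotated.append(f"{speaker}: " + re.sub(r"\b\w+\b", tag, utterance))
    return annotated

if __name__ == "__main__":
    dialogue = [
        ("Amanda", "I baked cookies. Do you want some?"),
        ("Jerry", "Sure! I will come by after work."),
    ]
    # The augmented text would then be fed to an abstractive summarizer.
    print("\n".join(resolve_pronouns(dialogue)))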
Similar Resources
Towards Improving Abstractive Summarization via Entailment Generation
Abstractive summarization, the task of rewriting and compressing a document into a short summary, has achieved considerable success with neural sequence-to-sequence models. However, these models can still benefit from stronger natural language inference skills, since a correct summary is logically entailed by the input document, i.e., it should not contain any contradictory or unrelated informat...
TL;DR: Improving Abstractive Summarization Using LSTMs
Traditionally, summarization has been approached through extractive methods. However, they have produced limited results. More recently, neural sequence-to-sequence models for abstractive text summarization have shown more promise, although the task still proves to be challenging. In this paper, we explore current state-of-the-art architectures and reimplement them from scratch. We begin with a ...
Dialogue Structure and Pronoun Resolution
This paper presents an empirical evaluation of a pronoun resolution algorithm augmented with discourse segmentation information. Past work has shown that segmenting discourse can aid in pronoun resolution by making potentially erroneous candidates inaccessible to a pronoun’s search. However, implementing this in practice has been difficult given the complexities associated with deciding on a us...
Improving Neural Abstractive Text Summarization with Prior Knowledge (Position Paper)
Abstractive text summarization is a complex task whose goal is to generate a concise version of a text without necessarily reusing the sentences from the original source, but still preserving the meaning and the key contents. In this position paper we address this issue by modeling the problem as a sequence to sequence learning and exploiting Recurrent Neural Networks (RNN). Moreover, we discus...
Dialogue focus tracking for zero pronoun resolution
We take a novel approach to zero pronoun resolution in Chinese: our model explicitly tracks the flow of focus in a discourse. Our approach, which generalizes to deictic references, is not reliant on the presence of overt noun phrase antecedents to resolve to, and allows us to address the large percentage of “non-anaphoric” pronouns filtered out in other approaches. We furthermore train our mode...
Journal
Journal title: Electronics
Year: 2023
ISSN: 2079-9292
DOI: https://doi.org/10.3390/electronics12143091